3 research outputs found

    Accelerating recurrent neural network training using sequence bucketing and multi-GPU data parallelization

    An efficient algorithm for recurrent neural network training is presented. The approach increases training speed for tasks where the length of the input sequence may vary significantly. The proposed approach is based on optimal batch bucketing by input sequence length and on data parallelization across multiple graphical processing units. The baseline training performance without sequence bucketing is compared with the proposed solution for different numbers of buckets. An example is given for the online handwriting recognition task using an LSTM recurrent neural network. The evaluation is performed in terms of wall clock time, number of epochs, and validation loss value.
    Comment: 4 pages, 5 figures. 2016 IEEE First International Conference on Data Stream Mining & Processing (DSMP), Lviv, 2016.
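    A minimal sketch of the bucketing idea, assuming Python and toy list-of-floats sequences (the names bucket_by_length, pad_bucket, and num_buckets are hypothetical, not taken from the paper): sequences are sorted by length, split into a chosen number of buckets, and each bucket is padded only to its own maximum length, so batches drawn from one bucket carry far less padding than batches drawn from the full, unsorted data. In the paper's setting, each padded bucket batch would then additionally be split across GPUs for data-parallel training.

```python
# Illustrative sketch of sequence bucketing (not the paper's implementation).
# Sequences are grouped by length so each batch is padded only to the longest
# sequence in its own bucket, reducing wasted computation on padding.

from typing import List

def bucket_by_length(sequences: List[List[float]], num_buckets: int) -> List[List[List[float]]]:
    """Split sequences into roughly equal-sized groups of similar length."""
    ordered = sorted(sequences, key=len)                  # shortest to longest
    bucket_size = max(1, len(ordered) // num_buckets)
    return [ordered[i:i + bucket_size] for i in range(0, len(ordered), bucket_size)]

def pad_bucket(bucket: List[List[float]], pad_value: float = 0.0) -> List[List[float]]:
    """Pad every sequence in a bucket to that bucket's maximum length."""
    max_len = max(len(seq) for seq in bucket)
    return [seq + [pad_value] * (max_len - len(seq)) for seq in bucket]

if __name__ == "__main__":
    # Six toy sequences of varying length, split into three buckets.
    sequences = [[1.0] * n for n in (3, 12, 5, 9, 4, 11)]
    for bucket in bucket_by_length(sequences, num_buckets=3):
        padded = pad_bucket(bucket)
        print(len(padded), "sequences padded to length", len(padded[0]))
```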

    Modeling of Hepatitis B Epidemic Process by the Risk Factors Analysis

    In this paper, a model to study risk factors for hepatitis B and to identify the main causes affecting its incidence was developed. The proposed model makes it possible to identify dependencies between the risk factors and hepatitis B morbidity, to detect the major factors that affect the intensity of the epidemic process, and to verify the effectiveness of preventive measures. As a result, a program was developed that improves the quality of management decisions in epidemiological surveillance.